Extended Lifted Inference with Joint Formulas
Authors
Abstract
The First-Order Variable Elimination (FOVE) algorithm allows exact inference to be applied directly to probabilistic relational models, and has proven to be vastly superior to applying standard inference methods to a grounded propositional model. Still, FOVE operators can only be applied under restricted conditions, often forcing one to resort to propositional inference. This paper aims to extend the applicability of FOVE by providing two new model conversion operators: the first and primary one is joint formula conversion, and the second is just-different counting conversion. These new operations allow efficient inference methods to be applied directly to relational models where no efficient method could previously be applied. In addition, aided by these capabilities, we show how to adapt FOVE to provide exact solutions to Maximum Expected Utility (MEU) queries over relational models for decision making under uncertainty. Experimental evaluations show that our algorithms provide significant speedups over the alternatives.
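To make the symmetry that lifted operators such as counting conversion exploit concrete, here is a small Python sketch (an illustration with assumed toy potentials, not the paper's algorithm or notation): n interchangeable Boolean variables share one unary potential, and a second factor depends only on how many of them are true, so the ground sum over 2^n assignments collapses to a sum over the n+1 possible counts.

```python
# Illustrative sketch of the symmetry lifted inference exploits (assumed toy model).
from itertools import product
from math import comb

n = 10
phi = {0: 0.3, 1: 0.7}           # shared unary potential (hypothetical values)
psi = lambda k: 1.0 + 0.1 * k    # factor that depends only on the number of true variables

# Ground partition function: enumerate all 2^n assignments.
Z_ground = 0.0
for assignment in product([0, 1], repeat=n):
    k = sum(assignment)
    weight = psi(k)
    for x in assignment:
        weight *= phi[x]
    Z_ground += weight

# Lifted partition function: one term per count k, weighted by the
# number C(n, k) of ground assignments that realize that count.
Z_lifted = sum(
    comb(n, k) * psi(k) * phi[1] ** k * phi[0] ** (n - k)
    for k in range(n + 1)
)

print(Z_ground, Z_lifted)  # the two values agree up to floating-point error
```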
Similar Papers
Generalized Counting for Lifted Variable Elimination
Lifted probabilistic inference methods exploit symmetries in the structure of probabilistic models to perform inference more efficiently. In lifted variable elimination, the symmetry among a group of interchangeable random variables is captured by counting formulas, and exploited by operations that handle such formulas. In this paper we generalize the structure of counting formulas and present ...
Lifted Probabilistic Inference with Counting Formulas
Lifted inference algorithms exploit repeated structure in probabilistic models to answer queries efficiently. Previous work such as de Salvo Braz et al.’s first-order variable elimination (FOVE) has focused on the sharing of potentials across interchangeable random variables. In this paper, we also exploit interchangeability within individual potentials by introducing counting formulas, which i...
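As a concrete, purely hypothetical illustration of the count-based representation these papers discuss: a factor that is symmetric in its n interchangeable Boolean arguments can be stored as a table of size n+1 indexed by the number of arguments set to 1, and multiplication and summing-out then operate on that table directly. The sketch below is a toy version of that idea, not the operators of any of the cited algorithms.

```python
# Toy sketch of count-indexed (symmetric) factors over n interchangeable Booleans.
from math import comb

n = 20

# Two hypothetical symmetric potentials, stored as count-indexed tables of size n+1.
f = [1.0 + 0.05 * k for k in range(n + 1)]
g = [2.0 - 0.03 * k for k in range(n + 1)]

# Product of factors over the same interchangeable variables: entrywise product
# of their count tables (a ground assignment with count k reads entry k in both).
h = [f[k] * g[k] for k in range(n + 1)]

# Summing out all n variables: each count k is realized by C(n, k) ground assignments.
total = sum(comb(n, k) * h[k] for k in range(n + 1))
print(total)
```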
Lower complexity bounds for lifted inference
One of the big challenges in the development of probabilistic relational (or probabilistic logical) modeling and learning frameworks is the design of inference techniques that operate on the level of the abstract model representation language, rather than on the level of ground, propositional instances of the model. Numerous approaches for such “lifted inference” techniques have been proposed. ...
Lifted Probabilistic Inference: An MCMC Perspective
The general consensus seems to be that lifted inference is concerned with exploiting model symmetries and grouping indistinguishable objects at inference time. Since first-order probabilistic formalisms are essentially template languages providing a more compact representation of a corresponding ground model, lifted inference tends to work especially well in these models. We show that the notio...
Lifted Inference for Relational Continuous Models
Relational Continuous Models (RCMs) represent joint probability densities over attributes of objects, when the attributes have continuous domains. With relational representations, they can model joint probability distributions over large numbers of variables compactly in a natural way. This paper presents a new exact lifted inference algorithm for RCMs, thus it scales up to large models of real...
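For intuition on the continuous case, a minimal sketch with assumed parameters (not the RCM algorithm itself): when n interchangeable objects each contribute the same Gaussian factor on a shared continuous attribute, their product is again proportional to a Gaussian whose natural parameters are those of a single factor scaled by n, so all n ground factors are absorbed in constant-time arithmetic.

```python
# Toy sketch: combining n identical Gaussian factors N(x; m, s2) in closed form.
n = 1000          # number of interchangeable objects
m, s2 = 2.5, 4.0  # shared mean and variance of each ground factor (assumed values)

# Lifted combination: scale the natural parameters (precision, precision * mean) by n.
precision = n / s2
combined_mean = m                # identical means remain the mean of the product
combined_var = 1.0 / precision

# Ground check: fold in the n identical factors one at a time.
p, pm = 0.0, 0.0
for _ in range(n):
    p += 1.0 / s2
    pm += m / s2
assert abs(pm / p - combined_mean) < 1e-9 and abs(1.0 / p - combined_var) < 1e-9
print(combined_mean, combined_var)
```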